Christopher Chan (cec272) & Harrison Hidalgo (hh559)
Design with Embedded Operating Systems, Fall 2020
Harrison is a Master of Engineering student in Mechanical Engineering at Cornell University. He is interested in dynamics, controls, and robotics. Harrison is also interested in the design process, starting from setting system requirements to fabrication. He enjoys bringing these different elements together to create fun and interesting projects.
Chris is a Master of Engineering student in Mechanical Engineering at Cornell University. He enjoys all aspects of engineering, from design to prototyping and manufacturing. Chris enjoys hands-on experience and creating projects that can potentially make a difference (plus they're cool!). He is passionate about the crossroads of aerospace, robotics, and autonomous vehicles.
Our objective was to create a mechanism that causes you to make a basket whenever you throw a basketball at a hoop. To do this, we used computer vision techniques to process a live feed of a basketball shot and used the Raspberry Pi to move the backboard using a series of filtering and controls algorithms. This project involved computer vision, path estimation, dynamics, motor control, and use of a multi-threaded operating system.
To accomplish this, we first made a static basketball hoop and mounted a movable backboard over it. The backboard is controlled by a Raspberry Pi (RPi). The RPi is connected to a camera which can sense where an incoming basketball is. When the camera senses an incoming basketball, the RPi reacts by orienting the backboard such that the ball bounces off of it into the hoop.
The backboard is able to rotate about two different axes, which we actuated with stepper motors. Since the camera needs to pick up several frames of the ball moving, calculations are done to predict the ball's trajectory and compute the correct inputs to the backboard. This project was heavily inspired by Stuff Made Here's Automatic Hoop; however, we explored whether we could make a $100 version of this channel's several-thousand-dollar project.
How to Set Up OpenCV 4 on Raspberry Pi for Face Detection | Raspberry Pi
How to Set Up Image Processing With OpenCV on Raspberry Pi | Raspberry Pi
Face Detection in 2 Minutes using OpenCV & Python
Overview | Adafruit LSM6DS33 6-DoF IMU Breakout | Adafruit
Adafruit TB6612 1.2A DC/Stepper Motor Driver Breakout Board
Calculator to compute the Distance or Size of an Object in a photo Image
In order to track our ball, we used computer vision algorithms from the OpenCV Python package. Our system used only one Raspberry Pi camera module, connected to the Pi's CSI port. Using this module, we found the ball's position and velocity with respect to the camera's x, y, and z coordinates. Our image processing algorithms are heavily influenced by this tutorial. Our camera was mounted behind the backboard, with the following coordinate systems:
We used the OpenCV package to continuously analyze individual frames from the camera's 30 fps video feed. Each frame is downsized to 640 x 480 pixels to improve processing speed. We applied a Gaussian blur over the image, which acts as a low-pass filter that reduces noise and smooths the image to aid recognition of the spherical ball. At first, we attempted to create a Haar feature-based cascade classifier for basketballs. We trained a classifier on over 1000 negative images from ImageNet's room dataset and 500 positive images taken of our basketball. Unfortunately, our classifier did not function well, as there were too many false positives. We believe this is due to our inadequate dataset and the lack of many unique features on the basketball. The tracking method that we used in the final product analyzes the hue, saturation, value (HSV) colorspace of the entire image. Since the ball is a relatively unique color, we identify upper and lower HSV thresholds to segment the ball. After masking the image in the HSV range, we check for contours and create enclosing circles. The ball is identified if the enclosing circle is larger than a specified radius, and we save the radius and coordinates of the center. Our algorithm functioned well and was relatively fast, even when no particular attention was paid to decluttering or color coordinating our clothes or the background. In our final product, we painted the ball green and adjusted the corresponding HSV bounds for even greater reliability.
Our HSV segmentation code is shown below:
image = frame.array
# Resize the frame, blur it, and convert to HSV color space
image = cv2.resize(image, (640, 480))
blurred = cv2.GaussianBlur(image, (11, 11), 0)
hsv = cv2.cvtColor(blurred, cv2.COLOR_BGR2HSV)
# Construct a mask for the ball's color (colorLower/colorUpper are the tuned
# HSV bounds), then perform a series of dilations and erosions to remove
# any small blobs left in the mask
mask = cv2.inRange(hsv, colorLower, colorUpper)
mask = cv2.erode(mask, None, iterations=2)
mask = cv2.dilate(mask, None, iterations=2)
# Find contours in the mask and initialize the current (x,y) center
# of the ball
cnts = cv2.findContours(mask.copy(), cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
cnts = imutils.grab_contours(cnts)
center = None
# Only proceed if at least one contour was found
if len(cnts) > 0:
    # Find the largest contour in the mask, then use it to compute
    # the minimum enclosing circle and centroid
    c = max(cnts, key=cv2.contourArea)
    ((x, y), radius) = cv2.minEnclosingCircle(c)
    M = cv2.moments(c)
    center = (int(M["m10"] / M["m00"]), int(M["m01"] / M["m00"]))
    # Only proceed if the radius meets a minimum size
    if radius > 20:
        # Draw the circle and centroid on the frame, then record the
        # detection and its timestamp
        ballDetected = True
        cv2.circle(image, (int(x), int(y)), int(radius), (0, 255, 255), 2)
        cv2.circle(image, center, 5, (0, 0, 255), -1)
        ball_dict['location'].append([x, y, radius])
        ball_dict['time'].append(time.time())
Below we can see how our ball-recognition software functions:
For position estimation, the main challenge was converting from pixel units to distance units (meters). We used a fixed reference value, the known radius of the ball, together with the measured radius in pixels to create a pixel-to-distance ratio, which serves as a conversion factor to find the ball's center in distance units. This works for the (x, y) coordinates, but to measure depth, the z-coordinate, we used a formula derived from the thin lens equation and our camera parameters: the image magnification, sensor height, and lens focal length. The velocities are simply calculated using a first-order finite difference between the last two positions.
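For reference, the pixel-ratio and depth relations implemented in the code below reduce to (the symbol names are ours):

x_m = x_px * R_ball / r_px
z = (f_lens * R_ball * H_image) / (r_px * h_sensor)

where R_ball is the physical ball radius, r_px the detected radius in pixels, f_lens the lens focal length, H_image the image height in pixels, and h_sensor the physical sensor height.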
# calculate x
x_m = x*r_ball/r # (mm)
x0_m = x0*r_ball/r0 # (mm)
v_x = (x_m-x0_m)/((t-t0)*1000) # (m/s)
# calculate y
y_m = y*r_ball/r # (mm)
y0_m = y0*r_ball/r0 # (mm)
v_y = (y_m-y0_m)/((t-t0)*1000) # (m/s)
# calculate z
z_m = (f_lens*r_ball*camera.resolution[1])/(r*h_sensor*1000) # (m)
z0_m = (f_lens*r_ball*camera.resolution[1])/(r0*h_sensor*1000) # (m)
v_z = (z0_m-z_m)/(t-t0) # (m/s)
Using the htop command, we found that the camera-based ball tracking takes a significant amount of processing power, so we created a program, camera_transmitter.py, that runs this code in the background and exports the ball's states into a .csv file. Our main program then reads from this .csv file, so our control loop timing (aside from state estimation error) is not affected by interruptions in camera_transmitter.py.
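A minimal sketch of what camera_transmitter.py's export step can look like is shown below; the exact 3 x 7 column layout of image.csv is our assumption, based on how main_program.py reads it back:

# Hypothetical sketch of camera_transmitter.py's export step (not the
# verbatim program): after each processed frame, overwrite image.csv with
# the latest ball states so the main loop never blocks on the camera.
import csv
import numpy as np

def export_states(states, path='image.csv'):
    # states: 3x7 array holding the most recent ball measurements; the
    # exact column layout (positions, velocities, detection flag) is assumed
    with open(path, 'w') as csvfile:
        csv.writer(csvfile).writerows(np.asarray(states))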
Christopher Chan contributed to this section of the project
The backboard was designed to have 3 degrees of freedom, allowing rotation about the backboard's y- and z-axes and translation along the board's x-axis. Since 3 noncollinear points define a plane, the backboard is connected to the baseboard via 3 motor-driven linkages. Gears translate the motor shaft's rotation into translation, and ball joints translate that translation into backboard rotation. We chose stepper motors because we can create an interface to specify the number of desired steps. The motors can also be "locked" by energizing the coils, which is advantageous for keeping the backboard still when the ball collides with it. These motors are driven by motor drivers, which each utilize 4 GPIO pins. One characteristic of stepper motors is that we are not able to determine the actual position of the drive shaft without additional hardware. We included limit switches on each of the geared linkages and use them to detect when the linkages make contact with the baseboard. From there, we can determine each motor's position relative to the backboard. Our design also includes an IMU, which we planned to use for closed-loop control of the backboard angle. However, through our testing, we found that open-loop control was sufficient for backboard control. Our Raspberry Pi GPIO configuration is shown below:
GPIO Pin   Use
________   ________
GPIO5      Limit M1
GPIO6      M1 B1
GPIO13     M1 B2
GPIO19     M1 A1
GPIO26     M1 A2
GPIO14     Limit M2
GPIO12     M2 A1
GPIO16     M2 A2
GPIO20     M2 B1
GPIO21     M2 B2
GPIO15     Limit M3
GPIO27     M3 A1
GPIO17     M3 A2
GPIO22     M3 B1
GPIO23     M3 B2
GPIO2      IMU SDA
GPIO3      IMU SCL
We created a CAD design, which allowed us to verify our linkage design and use 3-D printing and laser cutting to form our components. We iterated through two versions of our design, moving our hoop to a fixed point with respect to the baseboard and creating gear trains that allow for torque amplification from the drive shafts.
The final bill-of-materials is shown below:
Description               Unit Cost   Type of Unit   Total Cost
_______________________   _________   ____________   __________
Basketball Rim + Ball     $9.95       1 Pack         $9.95
IMU                       $9.95       1 Pack         $9.95
PiCamera                  $12.99      1 Pack         $12.99
Wood                      $4.29       1/8"x2"x4"     $4.29
Rod End                   $1.55       1 Pack         $9.30
5/16-18 Hex Nut           $3.70       100 Pack       $3.70
Stepper Motor             $23.99      3 Pack         $23.99
Limit Switch              $7.99       10 Pack        $7.99
Motor Driver              $4.95       1 Pack         $14.85

Total                                                $97.01
We constructed the physical backboard after several iterations of 3-D printing and laser cutting. When we built the physical backboard, we found that, due to our tolerances and limited access to build facilities, there was some play in the linkages and the UV (universal) joint. Our ball-socket joints also had a limited range of motion, which reduced the controllability of the system.
Below is a video explaining our backboard build:
Please click here for the non-embedded video tour of the backboard
Christopher Chan contributed to this section of the project
To process our data, we created two different filters. We started with an Extended Kalman Filter (EKF). An EKF is an algorithm that behaves very similarly to a traditional Kalman Filter; where they differ is in the types of models they can handle. A standard Kalman Filter assumes linear models, while an EKF extends the Kalman Filter to non-linear models. It does this by linearizing the non-linear dynamics about the previous estimate of the system. This EKF was simple and quick to put together for preliminary testing. For our final design, we strove to design and implement a square-root sigma point filter, also known as a square-root unscented filter. A sigma point filter, like the EKF, can handle non-linear systems. It does so by fitting Gaussian probabilities to non-linear data. First, several points, called sigma points, are generated; they are distributed based on the predicted covariances of the data. In this scheme we assume Gaussian process and sensor noise. The sigma points are then propagated through the non-linear dynamics of the system, which distorts the probability distribution. To cope with this, we fit a new Gaussian to the propagated sigma points, computed using a system of weights on each sigma point. We can then use this new probability distribution in the Kalman Filter update. Advantages of the sigma-point filter over the EKF are a higher level of accuracy and no need to compute the system Jacobian; disadvantages are a higher computational load and unintuitive design variables. To simplify tuning, we used a weight system that reduced the number of design variables from three to one.
Sigma Point Filter Depiction
A problem that can occur in real-time applications of Kalman Filters is a lack of numerical robustness. Covariances can grow to sizes where numerical errors lead to losses in accuracy. To prevent this, we used a square-root variant of the sigma point filter. Instead of working with covariance matrices directly, the square-root variant propagates a Cholesky factor (matrix square root) of the covariance matrix. This adds computation and requires post-processing to recover covariance data, but we judged that the higher level of robustness was necessary, as our system would be greatly affected by errors in state prediction. We also found that the square-root filter is highly sensitive to the definiteness of the covariance matrices: if a covariance matrix loses positive definiteness, the filter fails to predict future states. This rendered the square-root sigma-point filter unusable without highly accurate covariance matrices for the noise and the prior.
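Our weights module exposes the center and outer weights wm0, wc0, wm, and wc used by the filter below. As a sketch of how a three-parameter scheme can collapse to one, the scaled unscented transform can be tuned through the spread parameter alpha alone, with beta and kappa held at conventional values (our actual weight choices may differ):

# Sketch of scaled-unscented-transform weights with one tuning knob (alpha);
# beta and kappa are held at conventional values. Mirrors the wm0/wc0/wm/wc
# names used by our weights module, but the values here are illustrative.
n = 6          # state dimension: x, y, z, vx, vy, vz
alpha = 1e-3   # lone design variable: sigma-point spread
beta = 2.0     # optimal for Gaussian priors
kappa = 0.0    # conventional choice

lam = alpha**2 * (n + kappa) - n
wm0 = lam / (n + lam)                 # mean weight of the center point
wc0 = wm0 + (1 - alpha**2 + beta)     # covariance weight of the center point
wm = 1.0 / (2.0 * (n + lam))          # mean weight of each outer point
wc = wm                               # covariance weight of each outer point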
Below are our Extended Kalman Filter and Square-Root Sigma Point Filter:
# Harrison Hidalgo and Christopher Chan
# ECE 5725 - Final Project
# Extended Kalman Filter for the ball's state
import numpy as np
from runge_kutta import *
import Physical_Variables as p

def EKF_filter(x_hat_p,Pkk,measurement,Q,R,del_t,t0):
    # Linearized state transition matrix about the previous estimate
    F = [[np.asscalar(x_hat_p[None,3,0])*del_t,0,0,0,0,0],
         [0,np.asscalar(x_hat_p[None,4,0])*del_t,0,0,0,0],
         [0,0,np.asscalar(x_hat_p[None,5,0])*del_t,0,0,0],
         [0,0,0,0,0,0],
         [0,0,0,0,-p.g,0],
         [0,0,0,0,0,0]]
    G = np.identity(6)
    H = np.identity(6)
    # Predict
    t, x_hat_predict = runge_kutta(del_t,t0,x_hat_p)
    Pk2k = np.matmul(np.matmul(F,Pkk),np.transpose(F)) + np.matmul(np.matmul(G,Q),np.transpose(G))
    # Kalman Gain
    K = np.matmul(np.matmul(Pk2k,H),np.linalg.inv(np.matmul(np.matmul(H,Pk2k),np.transpose(H))+R))
    # Update (Joseph form for the covariance)
    x_hat = x_hat_predict + np.matmul(K,(measurement-x_hat_predict))
    I_KH = np.identity(6) - np.matmul(K,H)
    Pk2k2 = np.matmul(np.matmul(I_KH,Pk2k),np.transpose(I_KH)) + np.matmul(np.matmul(K,R),np.transpose(K))
    return x_hat, Pk2k2
# Harrison Hidalgo
# ECE 5725 - Final Project
# Square-Root Sigma Point Kalman Filter Ball
### Done before: initialize
import numpy as np
from ball_calc import *
from runge_kutta import *
import weights
import math

def SR_SPF_Ball(x_0,S_x0,S_v0,S_n0,n_sig,measurement,dt):
    # S_v0 is the process noise square root, S_n0 is the sensor noise square root
    H = np.identity(6)
    ## Create sigma points about the current estimate
    nx = len(x_0)
    ensp = np.ones((1,nx*2+1))
    sigma_points = np.matmul(x_0,ensp)+np.concatenate((np.zeros((nx,1)),n_sig*S_x0,-n_sig*S_x0),axis=1)
    ## Predict: propagate each sigma point through the ball dynamics
    X_p1 = np.zeros((nx,nx*2+1))
    time,X_p1[:,None,0] = runge_kutta(dt,0,sigma_points[:,None,0])
    x_p1 = X_p1[:,None,0]*weights.wm0
    for i in range(0,nx):
        time,X_p1[:,None,i+1] = runge_kutta(dt,0,sigma_points[:,None,1+i])
        time,X_p1[:,None,i+1+nx] = runge_kutta(dt,0,sigma_points[:,None,1+i+nx])
        x_p1 = weights.wm*(X_p1[:,None,i+1]+X_p1[:,None,i+1+nx])+x_p1
    # Build the compound matrix; the R factor of its QR decomposition is the
    # predicted square-root covariance
    B_k = np.zeros((2*nx,nx))
    for i in range(0,2*nx):
        B_k[i,None,:] = np.sqrt(weights.wc)*np.transpose(X_p1[:,None,i+1]-x_p1)
    B_k = np.concatenate((B_k,np.transpose(S_v0)),axis=0)
    q,S_x1 = np.linalg.qr(B_k)
    S_x1 = cholupdate(S_x1,X_p1[:,None,0]-x_p1,np.sign(weights.wc0))
    ## Re-create sigma points about the predicted mean
    X_p2 = np.matmul(x_p1,ensp)+np.concatenate((np.zeros((nx,1)),n_sig*S_x1,-n_sig*S_x1),axis=1)
    #### Update
    Y_k = np.zeros((nx,2*nx+1))
    Y_k[:,None,0] = np.matmul(H,X_p2[:,None,0])
    y_k = Y_k[:,None,0]*weights.wm0
    for i in range(1,2*nx+1):
        Y_k[:,None,i] = np.matmul(H,X_p2[:,None,i])
        y_k = y_k+Y_k[:,None,i]*weights.wm
    C_k = np.zeros((2*nx,nx))
    for i in range(0,2*nx):
        C_k[i,None,:] = np.sqrt(weights.wc)*np.transpose(Y_k[:,None,i+1]-y_k)
    C_k = np.concatenate((C_k,np.transpose(S_n0)),axis=0)
    q,S_yk = np.linalg.qr(C_k)
    S_yk = cholupdate(S_yk,Y_k[:,None,0]-y_k,np.sign(weights.wc0))
    # Cross covariance between state and measurement sigma points
    P_xy = weights.wc0*np.matmul((X_p2[:,None,0]-x_p1),np.transpose(Y_k[:,None,0]-y_k))
    for i in range(1,2*nx+1):
        P_xy = weights.wc*np.matmul((X_p2[:,None,i]-x_p1),np.transpose(Y_k[:,None,i]-y_k))+P_xy
    # Kalman gain: K = P_xy * inv(P_yy), with P_yy = S_yk^T S_yk
    Kalman_Gain = np.matmul(P_xy,np.linalg.inv(np.matmul(np.transpose(S_yk),S_yk)))
    innovation = measurement-np.matmul(H,x_p1)
    x_hat = x_p1+np.matmul(Kalman_Gain,innovation)
    # Downdate the square-root covariance column by column
    KF_gain_cov = np.matmul(Kalman_Gain,np.transpose(S_yk))
    S_xk = cholupdate(S_x1,KF_gain_cov[:,None,0],-1)
    for i in range(1,nx):
        S_xk = cholupdate(S_xk,KF_gain_cov[:,None,i],-1)
    return x_hat,S_xk

def cholupdate(R,x,sign):
    # Rank-one update (sign=+1) or downdate (sign=-1) of the Cholesky factor R
    p = len(x)
    for k in range(0,p):
        r = np.sqrt(R[k,None,k]**2 + sign*x[k]**2)
        c = r/R[k,None,k]
        s = x[k]/R[k,None,k]
        R[k,None,k] = r
        if k < p-1:
            R[(k+1):p,None,k] = (R[(k+1):p,None,k] + sign*s*x[(k+1):p])/c
            x[(k+1):p] = c*x[(k+1):p] - s*R[(k+1):p,None,k]
    R = validateCovMatrix(R)
    return R

def validateCovMatrix(sig):
    # Nudge a matrix back towards positive definiteness by flooring its
    # eigenvalues
    EPS = 10**-6
    ZERO = 10**-10
    sigma = sig
    if (is_pos_def(sigma) == False):
        w,v = np.linalg.eig(sigma)
        for n in range(0,len(w)):
            if (w[n] <= ZERO):
                w[n] = EPS
        sigma = np.matmul(np.matmul(v,np.diag(w)),np.transpose(v))
    return sigma

def is_pos_def(x):
    return np.all(np.linalg.eigvals(x) > 0)
A problem that we ran into was target identification: sometimes our OpenCV data would register false positives of balls in the frame. To prevent these false positives from corrupting our estimates, we implemented a 95% validation gate on our data. This gate runs a hypothesis test with an alpha value of 0.05: if a sample passes the test, we include it in our model; otherwise we ignore it and move on to the next measurement. A future improvement would be a more robust birth-and-death algorithm, which would let us more accurately track individual targets and track multiple targets at once.
# Harrison Hidalgo
# ECE 5725 - Final Project
# This program confirms or rejects measurements. Returns True if it accepts,
# False if it rejects.
import numpy as np

def measurement_validation(measurement,P,del_t,lam0,R,x_hat):
    # Find innovation and covariance
    H = np.identity(6)
    innovation = measurement - np.matmul(H,x_hat)
    S = np.matmul(np.matmul(H,P),np.transpose(H))+R
    # Find normalized innovation squared
    lam = np.matmul(np.matmul(np.transpose(innovation),np.linalg.inv(S)),innovation)
    # Accept the measurement only if it falls inside the gate
    if lam < lam0:
        acceptance = True
    else:
        acceptance = False
    return acceptance
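The gate threshold lam0 is supplied by the caller; for our 95% gate with a 6-dimensional innovation, it can be taken as the corresponding chi-square quantile (using scipy here purely for illustration):

# lam0 for a 95% validation gate: the normalized innovation squared is
# approximately chi-square distributed with 6 degrees of freedom
from scipy.stats import chi2
lam0 = chi2.ppf(0.95, df=6)  # roughly 12.59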
Harrison Hidalgo contributed to this section of the project
Once our system had converged to small enough covariances, we could begin actuating the backboard. To do this, the system runs through several programs designed to locate the optimal orientation of the board. We broke this up into several steps: mapping the ball's position from camera coordinates to backboard coordinates, calculating the possible backboard orientations that would result in a basket, and then mapping these orientations to different motor configurations. At that point we find the best motor orientation using the mean-square error from the current configuration and feed it into our control loop, where the motors begin actuation.
# Harrison Hidalgo
# ECE 5725
# This script returns True if covariances are small enough.
import math

def covariances_small_enough(S,min_covariances):
    # S is the square-root covariance; compare its diagonal against the
    # minimum acceptable standard deviations
    is_it = True
    for i in range(0,len(S)):
        if min_covariances[i] < math.sqrt(S[i,i]):
            is_it = False
    return is_it
Mapping the ball coordinates to baseboard coordinates involved a transformation matrix from the camera coordinates to the baseboard coordinates. We then needed to shift the ball's location from a position relative to the camera to a position relative to the baseboard. At this point, our system was ready to find the appropriate backboard orientation.
# Harrison Hidalgo and Christopher Chan
# ECE 5725 - Final Project
# This module contains all of our transforms.
import numpy as np
import math

def transform_baseboard_to_backboard(theta,phi,psi,x):
    # Parameters:
    #   theta: rotation about baseboard z axis
    #   phi:   rotation about baseboard y axis
    #   psi:   rotation about baseboard x axis
    #   x:     vector to transform
    # Output:
    #   x_new: transformed vector
    # R = Rz(theta) * Ry(phi) * Rx(psi)
    R = np.matmul(np.matmul(
        [[math.cos(theta),-math.sin(theta),0],[math.sin(theta),math.cos(theta),0],[0,0,1]],
        [[math.cos(phi),0,math.sin(phi)],[0,1,0],[-math.sin(phi),0,math.cos(phi)]]),
        [[1,0,0],[0,math.cos(psi),-math.sin(psi)],[0,math.sin(psi),math.cos(psi)]])
    x_new = np.matmul(R,x)
    return x_new

def transform_backboard_to_baseboard(theta,phi,psi,x):
    # Same parameters as above; applies the inverse rotation
    R = np.matmul(np.matmul(
        [[math.cos(theta),-math.sin(theta),0],[math.sin(theta),math.cos(theta),0],[0,0,1]],
        [[math.cos(phi),0,math.sin(phi)],[0,1,0],[-math.sin(phi),0,math.cos(phi)]]),
        [[1,0,0],[0,math.cos(psi),-math.sin(psi)],[0,math.sin(psi),math.cos(psi)]])
    R = np.linalg.inv(R)
    x_new = np.matmul(R,x)
    return x_new

def transform_camera_to_baseboard(position_camera):
    # Flip the y and z axes to go from camera coordinates to baseboard
    # coordinates
    new_position = [position_camera[0],-position_camera[1],-position_camera[2]]
    return new_position
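As an illustration of how these transforms chain together (all numbers below are placeholders, not our calibrated offsets):

# Hypothetical usage of the transform chain: express the ball's camera-frame
# position in baseboard coordinates, shift by the camera's mounting offset,
# then rotate into the tilted backboard frame. Values are placeholders.
import numpy as np
import transformations

ball_camera = [0.10, 0.50, 2.00]              # ball relative to camera (m)
ball_base = np.array(transformations.transform_camera_to_baseboard(ball_camera))
camera_offset = np.array([0.0, 0.20, -0.10])  # camera location in baseboard frame
ball_base = ball_base + camera_offset
ball_board = transformations.transform_baseboard_to_backboard(0.0, 0.10, -0.05, ball_base)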
To calculate the optimal backboard orientation, we first simulated the ball flying towards the board in its initial position; from there, we could see whether the ball would make contact with the backboard. We take this point of contact and find the direction the ball must travel in order to make a basket, calculated as the unit vector from the point of contact to the center of the hoop. We then iterate through the workspace of the backboard to find orientations that would produce a post-collision velocity within a reasonable margin of error of scoring a basket. We use a linear approximation of the ball's flight post-collision because the path between the backboard and hoop is short.
# Harrison Hidalgo
# ECE 5725 - Final Project
# Iterates over the backboard workspace to find orientations that score.
import iteration_variables as I_V
import numpy as np
import transformations
from path_tracker import *
import Geometric_Variables as GV

def system_iterator(e_b,r_B0,r_GB0,h,x_ball_init,front,up,W_of_backboard,H_of_backboard,T_of_backboard,r_of_ball,center_hoop):
    # Cycle through different theta, psi, phi, and B positions
    number_of_divisions = 20
    # Create vectors to iterate through (theta is held at zero)
    phi_iter = np.linspace(I_V.phi_min,I_V.phi_max,number_of_divisions)
    psi_iter = np.linspace(I_V.psi_min,I_V.psi_max,number_of_divisions)
    B_iter = np.linspace(I_V.B_min,I_V.B_max,number_of_divisions)
    does_it_work = False
    hits,location,velocity_at_location = path_tracker(h,x_ball_init,GV.front,GV.up,GV.W_of_backboard,GV.H_of_backboard,GV.T_of_backboard,0.06542,GV.center_hoop,GV.r_B0)
    if hits:
        need_vel_direction = (center_hoop-location)/np.linalg.norm(center_hoop-location)
    else:
        does_it_work = False
        working_orientations = 0
        return does_it_work,working_orientations
    theta = 0
    # For storing results
    working_orientations = []
    for j in range(0,number_of_divisions):        # phi
        for k in range(0,number_of_divisions):    # psi
            for l in range(0,number_of_divisions):  # B
                phi = phi_iter[j]
                psi = psi_iter[k]
                new_up = transformations.transform_baseboard_to_backboard(theta,phi,psi,up)
                new_front = transformations.transform_baseboard_to_backboard(theta,phi,psi,front)
                need_vel_direction = (center_hoop-location+B_iter[l]*e_b)/np.linalg.norm(center_hoop-location+B_iter[l]*e_b)
                velocity_after = collision(new_up,new_front,velocity_at_location,1)
                # Accept if the post-collision velocity is closely aligned with
                # the required direction (cosine similarity above 0.95)
                if np.dot(velocity_after,need_vel_direction) > 0.95*np.linalg.norm(need_vel_direction)*np.linalg.norm(velocity_after):
                    working_orientations = np.array([[theta],[phi],[psi]])
                    does_it_work = True
                    break
    return does_it_work,working_orientations
def collision(up,front,velocity,restitution):
    # Calculate the left component of the backboard.
    left = np.cross(front,up)
    # Decompose velocity into backboard coordinates.
    up_comp = np.dot(velocity,up)/np.dot(up,up)*up
    front_comp = np.dot(velocity,front)/np.dot(front,front)*front
    left_comp = np.dot(velocity,left)/np.dot(left,left)*left
    # Reflect the normal (front) component and put the velocity back together.
    velocity_new = up_comp-front_comp*restitution+left_comp
    return velocity_new
We take the set of possible backboard orientations and pipe them into a script that computes the motor orientations needed for each backboard orientation. Our algorithm is an iterative process that samples from our workspace and checks whether our physical constraints are upheld. If so, we deem the motor-backboard orientation combination possible and add it to our set of candidate orientations.
# Harrison Hidalgo
# ECE 5725 - Final Project
# This program parses through possible combos of theta1, theta2, and theta3
# to find the necessary combo.
import numpy as np
import transformations
import math
import iteration_variables as IV
import Geometric_Variables as GV

def find_angles(theta,phi,psi,r_B):
    theta_1 = np.array([])
    theta_2 = np.array([])
    theta_3 = np.array([])
    N = 5
    th1 = np.linspace(IV.th1_start,IV.th1_end,N)
    th2 = np.linspace(IV.th2_start,IV.th2_end,N)
    th3 = np.linspace(IV.th3_start,IV.th3_end,N)
    # Ball joint locations on the backboard for this orientation
    rP1 = r_B-transformations.transform_baseboard_to_backboard(theta,phi,psi,(GV.rGB0+GV.rGP10))
    rP2 = r_B-transformations.transform_baseboard_to_backboard(theta,phi,psi,(GV.rGB0+GV.rGP20))
    rP3 = r_B-transformations.transform_baseboard_to_backboard(theta,phi,psi,(GV.rGB0+GV.rGP30))
    does_it_work = False
    for i in range(0,N):
        for j in range(0,N):
            for k in range(0,N):
                th2_1 = th1[i]
                th2_2 = th2[j]
                th2_3 = th3[k]
                # Rotation about an arbitrary axis (Rodrigues' formula)
                motor_rod1 = np.dot(GV.motor_rod10,GV.motor_up1)*GV.motor_up1+math.cos(th2_1)*GV.motor_rod10+math.sin(th2_1)*np.cross(GV.motor_up1,GV.motor_rod10)
                motor_rod2 = np.dot(GV.motor_rod20,GV.motor_up2)*GV.motor_up2+math.cos(th2_2)*GV.motor_rod20+math.sin(th2_2)*np.cross(GV.motor_up2,GV.motor_rod20)
                motor_rod3 = np.dot(GV.motor_rod30,GV.motor_up3)*GV.motor_up3+math.cos(th2_3)*GV.motor_rod30+math.sin(th2_3)*np.cross(GV.motor_up3,GV.motor_rod30)
                # Motor points
                motor_point_1 = GV.gear2_1+motor_rod1
                motor_point_2 = GV.gear2_2+motor_rod2
                motor_point_3 = GV.gear2_3+motor_rod3
                # Rods
                r_P1_MP1 = motor_point_1 - rP1
                r_P2_MP2 = motor_point_2 - rP2
                r_P3_MP3 = motor_point_3 - rP3
                # Accept if all three linkage length constraints are (nearly) met
                if (np.linalg.norm(r_P1_MP1-rP1)-GV.L1 < 0.1) and (np.linalg.norm(r_P2_MP2-rP2)-GV.L2 < 0.1) and (np.linalg.norm(r_P3_MP3-rP3)-GV.L3 < 0.1):
                    does_it_work = True
                    theta_1 = th2_1
                    theta_2 = th2_2
                    theta_3 = th2_3
                    break
    return does_it_work,theta_1,theta_2,theta_3
Finally, we pipe the set of different motor orientations into a program that selects the optimal position to move to. This program uses mean-squared error, with the system's current position as the reference; the orientation with the least error is selected.
# Harrison Hidalgo
# ECE 5725 - Final Project
# Selects orientation to move to using mean-square error.
def orientation_selector(th_1_current,th_2_current,th_3_current,possible_orientations):
    min_MSE = float('inf')
    for i in range(0,len(possible_orientations[1,:])):
        MSE_th1 = (th_1_current-possible_orientations[0,i])**2
        MSE_th2 = (th_2_current-possible_orientations[1,i])**2
        MSE_th3 = (th_3_current-possible_orientations[2,i])**2
        if (MSE_th1+MSE_th2+MSE_th3) < min_MSE:
            theta1 = possible_orientations[0,i]
            theta2 = possible_orientations[1,i]
            theta3 = possible_orientations[2,i]
            min_MSE = MSE_th1+MSE_th2+MSE_th3
    return theta1,theta2,theta3
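As a usage example (with hypothetical angles, and assuming the function lives in orientation_selector.py as in our other modules), selecting among three candidate orientations looks like:

# Hypothetical usage: three candidate motor-angle columns (radians); the
# selector returns the column closest to the current motor configuration.
import numpy as np
from orientation_selector import orientation_selector

candidates = np.array([[0.10, 0.30, 0.00],
                       [0.20, 0.10, 0.05],
                       [0.00, 0.40, 0.10]])  # rows: theta1, theta2, theta3
goal1, goal2, goal3 = orientation_selector(0.05, 0.15, 0.02, candidates)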
Harrison Hidalgo contributed to this section of the project
On the real system, we created main_program.py, which starts our sensor-collection programs on separate threads in the background. These start the IMU, camera, and piTFT pygame functions that interface with our main program via .csv files. We also initialize the GPIO pins, the interrupts for the limit switches, and our variables.
Our main loop then begins, and we collect our measurements by reading from the data output into these .csv files. We use a few functions to process the measurements according to the state of the system. The system has 5 possible states: resetting the system according to the limit switches, collecting measurements, calculating the desired movements of the backboard, moving the motors, and waiting until the basketball hits the backboard. We interface with the piTFT through interface_creator.py, which keeps track of the control actions in a .csv file and checks whether a stop button has been pressed, which would clear our main_program loop variable.
A diagram explaining our main control loop is shown below:
Please click here for the non-embedded video explanation of our control software
Unfortunately, due to state estimation errors from our inaccurate camera mapping function, we were not able to get accurate enough ball velocity measurements for our system iterator function to reliably predict the optimal backboard angle. In many cases there was no result, as the optimal state was not reachable. To work around this, we implemented a simpler program, simple_main_program.py. This "Better Shot Backboard" system simply checks whether the ball is veering left or right of center: if the ball is left of center, the backboard angles counter-clockwise; if the ball is right of center, the backboard angles clockwise.
Our Better Shot Backboard was able to successfully track and adjust to different ball trajectories. We would have liked to use more optimal control schemes, but we believe we would need more robust (and expensive) hardware to do so. Some improvements we would make are metal-machined linkages with less play and a UV-joint that uses roller bearings. We would also add a more robust mounting plate and backboard. Improving our camera, either with an RGB-D camera or by mounting the camera in a different orientation (orthogonal to the backboard), would help capture the crucial depth measurement with more accuracy. With these improvements, we believe we could implement our more advanced filters and control algorithms, which would afford us optimal control of the backboard angle.
Please click here for the non-embedded video of shooting with the "Better Shot Backboard"
A link to our full code repository can be found here.
Our full simple_main_program.py code is shown below:
# Harrison Hidalgo and Chris Chan
# ECE 5725 - Final Project
# This is the main program for our system.
######## BEFORE WHILE LOOP #########
## Import modules
import time
import board
import digitalio
import RPi.GPIO as GPIO
import pygame
from pygame.locals import *
import os
import math
import Geometric_Variables as g
import Physical_Variables as p
from interface_variables import *
from system_iterator import *
from measurement_validation import *
import weights
import csv
from SR_SPF_Ball import SR_SPF_Ball
from covariances_small_enough import *
from motor_control import gearToStep
from motor_control import moveStepper
from adafruit_motor import stepper
from picamera.array import PiRGBArray
from picamera import PiCamera
from EKF_filter import *
import cv2
import imutils
import transformations
import numpy as np
## Addresses
image_data = 'image.csv'
data = np.zeros((3,7))
with open(image_data,'w') as csvfile:
    csvwriter = csv.writer(csvfile)
    csvwriter.writerows(data)
interface_states = 'interface_states.csv'
data = np.zeros((2,1))
with open(interface_states,'w') as csvfile:
    csvwriter = csv.writer(csvfile)
    csvwriter.writerows(data)
run_status = 'run.csv'
## Output to this file about the run status
run = True
with open(run_status,'w') as csvfile:
    csvwriter = csv.writer(csvfile)
    csvwriter.writerow([run])
## Start other scripts
os.system('python3 camera_transmitter.py &')
#os.system('python3 interface_creator.py &')
## Create constants
current_time = time.time()
start_time = time.time()
reset_time = 10 # time allowed between detected and system reset
run_time = 600
state = 0
motor_1_reset= False
motor_2_reset= False
motor_3_reset= False
# Motor GPIO pins and coils
coils = (
    # motor 1
    digitalio.DigitalInOut(board.D19),  # A1
    digitalio.DigitalInOut(board.D26),  # A2
    digitalio.DigitalInOut(board.D6),   # B1
    digitalio.DigitalInOut(board.D13),  # B2
    # motor 2
    digitalio.DigitalInOut(board.D12),  # A1
    digitalio.DigitalInOut(board.D16),  # A2
    digitalio.DigitalInOut(board.D20),  # B1
    digitalio.DigitalInOut(board.D21),  # B2
    # motor 3
    digitalio.DigitalInOut(board.D27),  # A1
    digitalio.DigitalInOut(board.D17),  # A2
    digitalio.DigitalInOut(board.D22),  # B1
    digitalio.DigitalInOut(board.D23),  # B2
)
for coil in coils:
    coil.direction = digitalio.Direction.OUTPUT
motor1 = stepper.StepperMotor(coils[0], coils[1], coils[2], coils[3], microsteps=None)
motor2 = stepper.StepperMotor(coils[4], coils[5], coils[6], coils[7], microsteps=None)
motor3 = stepper.StepperMotor(coils[8], coils[9], coils[10], coils[11], microsteps=None)
degPerStep = 1.8 # number of degrees per step of the motor
motor1Angle = 0
motor2Angle = 0
motor3Angle = 0
angleStepSize = math.pi/10
# Limit switch GPIO pins
LS_1 = 5
LS_2 = 14
LS_3 = 15
# Gear Specs
motorGearTeeth = 16
leverGearTeeth = 36
## Interrupts
GPIO.setmode(GPIO.BCM)
GPIO.setup(LS_1, GPIO.IN, pull_up_down=GPIO.PUD_UP)
GPIO.setup(LS_2, GPIO.IN, pull_up_down=GPIO.PUD_UP)
GPIO.setup(LS_3, GPIO.IN, pull_up_down=GPIO.PUD_UP)
def LIMIT_SWITCH_1(channel):
    global motor_1_reset
    motor_1_reset = True
def LIMIT_SWITCH_2(channel):
    global motor_2_reset
    motor_2_reset = True
def LIMIT_SWITCH_3(channel):
    global motor_3_reset
    motor_3_reset = True
GPIO.add_event_detect(LS_1,GPIO.RISING,callback=LIMIT_SWITCH_1,bouncetime=300)
GPIO.add_event_detect(LS_2,GPIO.RISING,callback=LIMIT_SWITCH_2,bouncetime=300)
GPIO.add_event_detect(LS_3,GPIO.RISING,callback=LIMIT_SWITCH_3,bouncetime=300)
## Ball Measurement Variables
x_hat = 0
t_last = None
ball_seen = False
thresh = 0.2
## EKF Variables
Q = np.identity(6)*1 # covariance of state noise
R = np.identity(6)*10 # covariance of measurement noise
Pk2k2 = np.identity(6)*100 # initialize the state covariance matrix
######## START WHILE LOOP ##########
while (current_time-start_time) < run_time and run:
    ## Update loop variable
    current_time = time.time()
    ## Collect measurements
    # Collect image data from csv
    camera_measurements = np.zeros((3,7))
    with open(image_data,'r') as csvfile:
        csvreader = csv.reader(csvfile)
        row_num = 0
        for row in csvreader:
            if row_num >= 3:
                row_num = 0
            camera_measurements[row_num,:] = row
            row_num = row_num + 1
    ## Process measurements
    for i in range(0,3):
        # Check if there's a ball
        if (np.all(x_hat == 0) and not np.all(camera_measurements == 0)):
            ball_seen = True
            x_hat = np.array([[camera_measurements[1,0]]])
    ball_in_sight = np.array([camera_measurements[1,6]])
    # Move on once a ball has been seen
    if ball_seen and state == 1:
        state = 2
    ## Process according to state
    # State 0: Reset system.
    # State 1: Collect measurements.
    # State 2: Find position to move to.
    # State 3: Move backboard to position.
    # State 4: Pause until hit.
    if state == 0:
        # Check which limit switches have been reached
        motors = []
        direc = []
        if motor_1_reset == False:
            motors.append(motor1)
            direc.append(stepper.BACKWARD)
        if motor_2_reset == False:
            motors.append(motor2)
            direc.append(stepper.FORWARD)
        if motor_3_reset == False:
            motors.append(motor3)
            direc.append(stepper.FORWARD)
        # Move motors that haven't hit their limit
        moveStepper(motors, [1]*len(motors), direc)
        # Move onto the next state once all motors have hit their limits
        if motor_1_reset and motor_2_reset and motor_3_reset:
            motor1Angle = 0
            motor2Angle = 0
            motor3Angle = 0
            state = 1
            time.sleep(1)
    elif state == 2:
        theta1_goal = 0
        theta2_goal = 0
        theta3_goal = 0
        if x_hat[0,0] >= g.W_of_backboard/3:
            if motor3Angle < math.pi/4-angleStepSize:
                theta3_goal = angleStepSize
                motor3Angle = motor3Angle + angleStepSize
            if motor1Angle >= math.pi/12+angleStepSize:
                theta1_goal = -angleStepSize
                motor1Angle = motor1Angle - angleStepSize
        elif x_hat[0,0] < g.W_of_backboard/3:
            if motor1Angle < math.pi/4-angleStepSize:
                theta1_goal = angleStepSize
                motor1Angle = motor1Angle + angleStepSize
            if motor3Angle >= math.pi/12+angleStepSize:
                theta3_goal = -angleStepSize
                motor3Angle = motor3Angle - angleStepSize
        state = 3
    elif state == 3:
        # Use controller to find backboard movement, then move the backboard
        steps = []
        direc = []
        motorNumber = 1
        for angle in [theta1_goal, theta2_goal, theta3_goal]:
            # Find desired angle based on gearing
            des_step = gearToStep(leverGearTeeth, motorGearTeeth, angle, degPerStep)
            # Determine motor direction
            if motorNumber == 1:
                if des_step < 0:
                    steps.append(abs(des_step))
                    direc.append(stepper.BACKWARD)
                else:
                    steps.append(des_step)
                    direc.append(stepper.FORWARD)
            if motorNumber == 2 or motorNumber == 3:
                if des_step < 0:
                    steps.append(abs(des_step))
                    direc.append(stepper.FORWARD)
                else:
                    steps.append(des_step)
                    direc.append(stepper.BACKWARD)
            motorNumber = motorNumber + 1
        moveStepper([motor1, motor2, motor3], steps, direc)
        state = 4
        end_state_3_time = time.time()
    elif state == 4:
        # Do nothing until the ball hits, then release the motors and reset
        if (time.time()-end_state_3_time) > reset_time:
            motor1.release()
            motor2.release()
            motor3.release()
            state = 0
    else:
        state = 1
    # Send to interfacer
    interface_states = 'interface_states.csv'
    data = np.array([ball_in_sight, [state]])
    with open(interface_states,'w') as csvfile:
        csvwriter = csv.writer(csvfile)
        csvwriter.writerows(data)
    # Check if a stop signal has been flagged from other programs
    with open(run_status,'r') as csvfile:
        csvreader = csv.reader(csvfile)
        for row in csvreader:
            run = (row[0] == 'True')  # csv stores strings, so compare against 'True'
GPIO.cleanup()
Christopher Chan and Harrison Hidalgo contributed to this section of the project